Orthogonal Nonlinear Least-Squares Regression in R
Author
Abstract
Orthogonal nonlinear least squares (ONLS) regression is an infrequently applied and largely overlooked regression technique that comes into play when one encounters an "error-in-variables" problem. While classical nonlinear least squares (NLS) aims to minimize the sum of squared vertical residuals, ONLS minimizes the sum of squared orthogonal residuals. The method is based on finding points on the fitted curve that are orthogonal to the data by minimizing, for each (x_i, y_i), the Euclidean distance ‖D_i‖ to some point (x0_i, y0_i) on the fitted curve. A 25-year-old FORTRAN implementation of ONLS is available (ODRPACK, http://www.netlib.org/toms/869.zip), which has been included in the 'scipy' package for Python (http://docs.scipy.org/doc/scipy-0.14.0/reference/odr.html). Here, onls has been developed in R to allow easy future tweaking of the algorithm. The results obtained from onls are identical to those found in [1, 4]. The implementation is based on an inner loop that uses optimize for each (x_i, y_i) to find min ‖D_i‖ within some window [x_i − w, x_i + w], and an outer loop for the fit parameters using nls.lm of the 'minpack.lm' package.
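The inner/outer loop structure described in the abstract can be sketched in a few lines. The following is a minimal Python analogue (not the onls R code itself): the inner loop uses scipy.optimize.minimize_scalar to find, for each data point, the closest point on the curve within the window [x_i − w, x_i + w], and the outer loop refines the parameters with scipy.optimize.least_squares. The exponential model, the window width w, and all parameter values are illustrative assumptions, not part of the package.

```python
import numpy as np
from scipy.optimize import minimize_scalar, least_squares

# Hypothetical model for illustration: f(theta, x) = theta0 * exp(-theta1 * x)
def f(theta, x):
    return theta[0] * np.exp(-theta[1] * x)

def orth_residuals(theta, x, y, w=2.0):
    """Orthogonal residuals: for each (x_i, y_i), the minimal Euclidean
    distance ||D_i|| to a point (x0_i, f(theta, x0_i)) on the curve,
    searched within the window [x_i - w, x_i + w] (the 'inner loop')."""
    res = []
    for xi, yi in zip(x, y):
        d = lambda x0: np.hypot(x0 - xi, f(theta, x0) - yi)
        opt = minimize_scalar(d, bounds=(xi - w, xi + w), method="bounded")
        res.append(opt.fun)
    return np.array(res)

# Simulated data with noise; true parameters are [3.0, 0.7].
rng = np.random.default_rng(1)
x = np.linspace(0, 5, 30)
y = 3.0 * np.exp(-0.7 * x) + rng.normal(scale=0.05, size=x.size)

# 'Outer loop': minimize the sum of squared orthogonal residuals
# over the model parameters (least_squares here stands in for nls.lm).
fit = least_squares(orth_residuals, x0=[1.0, 1.0], args=(x, y))
print(fit.x)  # parameter estimates should land near the true [3.0, 0.7]
```

Note that scipy.odr, mentioned in the abstract, provides a production-grade version of this idea (the ODRPACK algorithm); the sketch above only mirrors the loop structure that onls describes.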
Similar articles
Identification of nonlinear systems with non-persistent excitation using an iterative forward orthogonal least squares regression algorithm
A new iterative orthogonal least squares forward regression (iOFR) algorithm is proposed to identify nonlinear systems which may not be persistently excited. By slightly revising the classic forward orthogonal regression (OFR) algorithm, the new iterative algorithm provides search solutions on a global solution space. Examples show that the new iterative algorithm is computationally efficient a...
Automatic Kernel Regression Modelling Using Combined Leave-One-Out Test Score and Regularised Orthogonal Least Squares
This paper introduces an automatic robust nonlinear identification algorithm using the leave-one-out test score also known as the PRESS (Predicted REsidual Sums of Squares) statistic and regularised orthogonal least squares. The proposed algorithm aims to achieve maximised model robustness via two effective and complementary approaches, parameter regularisation via ridge regression and model op...
Robust nonlinear model identification methods using forward regression
In this correspondence new robust nonlinear model construction algorithms for a large class of linear-in-the-parameters models are introduced to enhance model robustness via combined parameter regularization and new robust structural selective criteria. In parallel to parameter regularization, we use two classes of robust model selection criteria based on either experimental design criteria tha...
Fully complex-valued radial basis function networks: Orthogonal least squares regression and classification
We consider a fully complex-valued radial basis function (RBF) network for regression and classification applications. For regression problems, the locally regularised orthogonal least squares (LROLS) algorithm aided with the D-optimality experimental design, originally derived for constructing parsimonious real-valued RBF models, is extended to the fully complex-valued RBF (CVRBF) network. Lik...
Local regularization assisted orthogonal least squares regression
A locally regularized orthogonal least squares (LROLS) algorithm is proposed for constructing parsimonious or sparse regression models that generalize well. By associating each orthogonal weight in the regression model with an individual regularization parameter, the ability for the orthogonal least squares model selection to produce a very sparse model with good generalization performance is g...